Adaptive Feature Selection: Computationally Efficient Online Sparse Linear Regression under RIP


A. Proofs for Realizable Setting

Proof of Lemma 3. Let ∆ := ŵ − w∗ be the difference between the solution ŵ to the optimization problem and the true answer w∗. Let S be the support of w∗ and let S̄ := [d] \ S be the complement of S. Consider the permutation i1, . . . , i(d−k) of S̄ for which |∆(ij)| ≥ |∆(i(j+1))| for all j; that is, the permutation dictated by the magnitude of the entries of ∆ outside of S. We split S̄ into subsets of size k according to this permutation: define Sj, for j ≥ 1, as {i((j−1)k+1), . . . , i(jk)}. For convenience we also denote by S01 the set S ∪ S1. Now consider the matrix X_{S01} ∈ R^{t×|S01|} whose columns are those of X with indices in S01. Since |S01| ≤ 2k, the Restricted Isometry Property of X dictates that for any vector c ∈ R^{|S01|},

(1 − ε)‖c‖₂ ≤ ‖X_{S01} c‖₂ ≤ (1 + ε)‖c‖₂.
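To make the block construction concrete, here is a minimal Python sketch (purely illustrative; the names delta, support, and k are ours, not the paper's) that sorts the off-support coordinates of ∆ by magnitude and splits them into the blocks S1, S2, . . . described above.

```python
import numpy as np

def split_off_support(delta, support, k):
    """Sort the coordinates outside `support` by |delta| (descending)
    and split them into consecutive blocks S1, S2, ... of size k."""
    d = delta.shape[0]
    complement = [i for i in range(d) if i not in support]
    # Permutation i1, ..., i_{d-k} of the complement with |delta(i_j)| non-increasing.
    complement.sort(key=lambda i: -abs(delta[i]))
    # Blocks S_j = {i_{(j-1)k+1}, ..., i_{jk}}; the last block may be shorter.
    return [complement[j:j + k] for j in range(0, len(complement), k)]

# Toy example: a 3-sparse w* in dimension 8, with k = 3.
rng = np.random.default_rng(0)
w_star = np.zeros(8)
w_star[[0, 3, 5]] = [1.0, -2.0, 0.5]
w_hat = w_star + 0.1 * rng.standard_normal(8)
delta = w_hat - w_star
blocks = split_off_support(delta, support={0, 3, 5}, k=3)
S01 = sorted({0, 3, 5}.union(blocks[0]))  # S ∪ S1
print(blocks, S01)
```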


Similar Articles

Adaptive Feature Selection: Computationally Efficient Online Sparse Linear Regression under RIP

Online sparse linear regression is an online problem where an algorithm repeatedly chooses a subset of coordinates to observe in an adversarially chosen feature vector, makes a real-valued prediction, receives the true label, and incurs the squared loss. The goal is to design an online learning algorithm with sublinear regret relative to the best sparse linear predictor in hindsight. Without any assumptions...
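The interaction protocol described here can be written down directly. The sketch below is a schematic rendering under our own naming; FixedSubsetLearner is a toy baseline that always observes the same coordinates, not the paper's algorithm, and exists only to make the observe/predict/label/loss loop runnable.

```python
import numpy as np

class FixedSubsetLearner:
    """Toy baseline: always observe the first k0 coordinates and run
    online gradient descent on them."""
    def __init__(self, d, k0, lr=0.01):
        self.S = list(range(k0))          # fixed observed subset
        self.w = np.zeros(k0)
        self.lr = lr
    def choose_coordinates(self):
        return self.S
    def predict(self, x_obs):
        self._x = np.array([x_obs[i] for i in self.S])
        return float(self.w @ self._x)
    def update(self, y, y_hat):
        # Gradient step on the squared loss (y_hat - y)^2.
        self.w -= self.lr * 2.0 * (y_hat - y) * self._x

def run_protocol(learner, rounds, d, rng):
    """One pass of the online sparse regression protocol."""
    total = 0.0
    for _ in range(rounds):
        x = rng.standard_normal(d)                     # stand-in for an adversarial vector
        y = x[0] - 0.5 * x[1]                          # stand-in for the true label
        S = learner.choose_coordinates()               # coordinates to observe
        y_hat = learner.predict({i: x[i] for i in S})  # predict from observed entries only
        learner.update(y, y_hat)                       # true label is then revealed
        total += (y_hat - y) ** 2                      # squared loss
    return total

rng = np.random.default_rng(0)
print(run_protocol(FixedSubsetLearner(d=10, k0=2), rounds=500, d=10, rng=rng))
```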


Variable Inclusion and Shrinkage Algorithms

The Lasso is a popular and computationally efficient procedure for automatically performing both variable selection and coefficient shrinkage on linear regression models. One limitation of the Lasso is that the same tuning parameter is used for both variable selection and shrinkage. As a result, it typically ends up selecting a model with too many variables to prevent over-shrinkage of the regression...
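A small scikit-learn sketch (synthetic data of our own making) makes this single-tuning-parameter coupling visible: raising alpha removes variables and shrinks the surviving coefficients at the same time.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
w_true = np.zeros(20)
w_true[:3] = [3.0, -2.0, 1.5]                  # 3-sparse ground truth
y = X @ w_true + 0.1 * rng.standard_normal(100)

# One tuning parameter, alpha, controls both effects at once:
# larger alpha -> fewer selected variables AND more shrinkage of survivors.
for alpha in (0.01, 0.1, 1.0):
    fit = Lasso(alpha=alpha).fit(X, y)
    print(alpha, np.count_nonzero(fit.coef_), np.round(fit.coef_[:3], 2))
```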


The Performance of Group Lasso for Linear Regression of Grouped Variables

The lasso [19] and group lasso [23] are popular algorithms in the signal processing and statistics communities. In signal processing, these algorithms allow for efficient sparse approximations of arbitrary signals in overcomplete dictionaries. In statistics, they facilitate efficient variable selection and reliable regression under the linear model assumption. In both cases, there is now ample ...
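The grouped penalty's core computational primitive is block soft-thresholding, the proximal operator of τ Σ_g ‖v_g‖₂, which zeroes or shrinks whole groups at once. Below is a generic numpy sketch (our own illustration, not code from any of the cited papers).

```python
import numpy as np

def group_soft_threshold(v, groups, tau):
    """Proximal operator of tau * sum_g ||v_g||_2: shrink each group's
    coordinates toward zero jointly, zeroing entire groups at once."""
    out = np.zeros_like(v)
    for g in groups:
        norm = np.linalg.norm(v[g])
        if norm > tau:
            out[g] = (1.0 - tau / norm) * v[g]  # whole group survives, shrunk
        # else: the whole group is set to zero (group-level selection)
    return out

v = np.array([3.0, 0.1, -0.2, 0.05, 2.0, -1.0])
groups = [[0, 1], [2, 3], [4, 5]]
print(group_soft_threshold(v, groups, tau=1.0))  # middle group is zeroed out
```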


Isometry-enforcing Data Transformations for Improving Sparse Model Learning

Imposing sparsity constraints (such as l1-regularization) on the model parameters is a practical and efficient way of handling very high-dimensional data, which also yields interpretable models due to embedded feature selection. Compressed sensing (CS) theory provides guarantees on the quality of sparse signal (in our case, model) reconstruction that rely on the so-called restricted isometry ...
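As a concrete illustration of the restricted isometry property referred to here, the sketch below estimates by random sampling (our own toy setup; the sizes are arbitrary) how nearly a column-normalized Gaussian design preserves the norm of k-sparse vectors. Note this is only a Monte Carlo probe, not a certificate: verifying RIP exactly is computationally hard.

```python
import numpy as np

rng = np.random.default_rng(0)
t, d, k = 80, 200, 5
X = rng.standard_normal((t, d)) / np.sqrt(t)  # normalized Gaussian design

# Estimate how far ||Xc||_2 / ||c||_2 strays from 1 over random k-sparse c.
ratios = []
for _ in range(2000):
    support = rng.choice(d, size=k, replace=False)
    c = np.zeros(d)
    c[support] = rng.standard_normal(k)
    ratios.append(np.linalg.norm(X @ c) / np.linalg.norm(c))
print(f"empirical range: [{min(ratios):.3f}, {max(ratios):.3f}]")  # close to [1-eps, 1+eps]
```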


Sparse partial least squares for on-line variable selection in multivariate data streams

In this paper we propose a computationally efficient algorithm for on-line variable selection in multivariate regression problems involving high-dimensional data streams. The algorithm recursively extracts all the latent factors of a partial least squares solution and selects the most important variables for each factor. This is achieved by means of only one sparse singular value decomposition ...
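The sparse-SVD step at the heart of such methods can be sketched as alternating power iteration with soft-thresholding on the right singular vector (a generic formulation under our own naming; the paper's exact routine may differ).

```python
import numpy as np

def sparse_rank_one_svd(M, tau, iters=100):
    """One sparse-SVD step: alternating power iteration, soft-thresholding
    the right singular vector so the extracted factor selects few variables."""
    u = np.linalg.svd(M, full_matrices=False)[0][:, 0]  # warm start
    v = np.zeros(M.shape[1])
    for _ in range(iters):
        v = M.T @ u
        v = np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)  # soft-threshold -> sparsity
        if not v.any():
            break
        v /= np.linalg.norm(v)
        u = M @ v
        u /= np.linalg.norm(u)
    return u, v  # nonzeros of v mark the variables selected for this factor

rng = np.random.default_rng(0)
M = np.outer(rng.standard_normal(30), np.r_[np.ones(3), np.zeros(17)])
M += 0.1 * rng.standard_normal((30, 20))
u, v = sparse_rank_one_svd(M, tau=0.5)
print(np.nonzero(v)[0])  # should roughly recover the first 3 variables
```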




Publication year: 2017